Metric selection in fast dual forward-backward splitting
Authors
Abstract
The performance of fast forward-backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric defined by the space on which the optimization problem is stated; selecting a space on which the problem data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexities, for finding a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing their performance to the case where the Euclidean space is used. For the most ill-conditioned problem we consider, the computational cost is reduced by two to three orders of magnitude. We also report performance that is comparable to, or better than, that of state-of-the-art optimization software.
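To make the role of the metric concrete, the sketch below runs a fast proximal gradient (FISTA-type) iteration in a diagonal metric H for an l1-regularized quadratic program. This is a minimal illustration under stated assumptions, not the selection procedures proposed in the paper (which work with the dual problem): the function name, the problem data (Q, q, lam), and the Jacobi-type choice H = diag(Q) are all illustrative. With H equal to the identity, the code reduces to the standard Euclidean method.

```python
import numpy as np

def fista_diag_metric(Q, q, lam, H_diag, iters=500):
    """Fast proximal gradient (FISTA-type) iteration for
        minimize 0.5*x'Qx - q'x + lam*||x||_1
    run in the metric induced by the diagonal matrix H = diag(H_diag).
    H_diag = np.ones(n) gives the usual Euclidean method; a choice such as
    H_diag = np.diag(Q) acts as a simple diagonal preconditioner.
    Illustrative sketch only, not the paper's metric selection procedures.
    """
    n = len(q)
    x = np.zeros(n)
    y = x.copy()
    t = 1.0
    # Lipschitz constant of grad f in the H-metric: lambda_max(H^{-1/2} Q H^{-1/2}).
    Hs = np.sqrt(H_diag)
    L = np.linalg.eigvalsh(Q / np.outer(Hs, Hs)).max()
    for _ in range(iters):
        grad = Q @ y - q                      # forward (gradient) step ...
        z = y - grad / (L * H_diag)           # ... scaled by (L*H)^{-1}
        # backward step: prox of lam*||.||_1 in the H-metric, i.e. coordinate-wise
        # soft-thresholding with thresholds lam / (L * H_diag)
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / (L * H_diag), 0.0)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

With an ill-conditioned Q (for example, variables scaled over several orders of magnitude), such a diagonal metric typically reduces the iteration count substantially relative to the Euclidean choice, which is the effect the paper's metric selection procedures pursue in a more systematic way.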
Similar works
Variable Metric Forward-Backward Splitting with Applications to Monotone Inclusions in Duality
We propose a variable metric forward-backward splitting algorithm and prove its convergence in real Hilbert spaces. We then use this framework to derive primal-dual splitting algorithms for solving various classes of monotone inclusions in duality. Some of these algorithms are new even when specialized to the fixed metric case. Various applications are discussed.
A Splitting Algorithm for Coupled System of Primal-Dual Monotone Inclusions
We propose a splitting algorithm for solving a coupled system of primal-dual monotone inclusions in real Hilbert spaces. The proposed algorithm has a structure identical to that of the forward-backward algorithm with variable metric. The operators involved in the problem formulation are used separately in the sense that single-valued operators are used individually and approximately in the forw...
A Field Guide to Forward-Backward Splitting with a FASTA Implementation
Non-differentiable and constrained optimization play a key role in machine learning, signal and image processing, communications, and beyond. For high-dimensional minimization problems involving large datasets or many unknowns, the forward-backward splitting method (also known as the proximal gradient method) provides a simple, yet practical solver. Despite its apparent simplicity, the performan...
Comparative Approach to the Backward Elimination and Forward Selection Methods in Modeling the Systematic Risk Based on the ARFIMA-FIGARCH Model
The present study aims to model systematic risk using financial and accounting variables. Accordingly, the data for 174 companies listed on the Tehran Stock Exchange are extracted for the period 2006 to 2016. First, the systematic risk index is estimated using the ARFIMA-FIGARCH model. Then, based on the research background, 35 influential financial and accounting variables are simultaneously used with t...
Projected-gradient algorithms for generalized equilibrium seeking in Aggregative Games are preconditioned Forward-Backward methods
We show that projected-gradient methods for the distributed computation of generalized Nash equilibria in aggregative games are preconditioned forward-backward splitting methods applied to the KKT operator of the game. Specifically, we adopt the preconditioned forward-backward design, recently conceived by Yi and Pavel in the manuscript “A distributed primal-dual algorithm for computation of ge...
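Several of the related works above revolve around the plain or preconditioned forward-backward iteration. For reference, a bare forward-backward splitting step takes only a few lines; the sketch below applies it to a nonnegative least-squares problem. The function name and problem setup are illustrative assumptions, and the code does not use the FASTA package's actual interface.

```python
import numpy as np

def forward_backward_nnls(A, b, iters=1000):
    """Plain forward-backward splitting (proximal gradient) for
        minimize 0.5*||A x - b||^2  subject to x >= 0.
    Forward step: gradient step on the smooth least-squares term.
    Backward step: prox of the indicator of {x >= 0}, i.e. projection.
    Illustrative sketch only; not the FASTA solver's API.
    """
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # forward: gradient of 0.5*||Ax - b||^2
        x = np.maximum(x - step * grad, 0.0)   # backward: projection onto x >= 0
    return x
```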
Journal: Automatica
Volume: 62, Issue: -
Pages: -
Publication year: 2015